Adaptive Kernel Based Machine Learning Methods

Author

  • YUESHENG XU
Abstract

During the support period July 1, 2011 to June 30, 2012, seven research papers were published. They fall into three types:

  • Research that directly addresses the kernel selection problem in machine learning [1, 2].
  • Research that closely relates to the fundamental issues of the research proposed under this grant [3, 4, 5, 6].
  • Research in the general context of computational mathematics [7].

Paper [1] studies the construction of a refinement kernel for a given operator-valued reproducing kernel such that the vector-valued reproducing kernel Hilbert space of the refinement kernel contains that of the given kernel as a subspace. The study is motivated by the need to update the current operator-valued reproducing kernel in multi-task learning when underfitting or overfitting occurs. Numerical simulations confirm that the established refinement kernel method is able to meet this need. Various characterizations are provided based on feature maps and vector-valued integral representations of operator-valued reproducing kernels. Concrete examples of refining translation-invariant and finite Hilbert-Schmidt operator-valued reproducing kernels are provided. Other examples include refinement of the Hessian of scalar-valued translation-invariant kernels and of transformation kernels. Existence and properties of operator-valued reproducing kernels preserved during the refinement process are also investigated.

Motivated by the importance of kernel-based methods for multi-task learning, we provide in [2] a complete characterization of multi-task finite-rank kernels in terms of the positivity of what we call its associated characteristic operator. Consequently, we establish that every continuous multi-task kernel defined on a cube in a Euclidean space can not only be uniformly approximated by multi-task polynomial kernels, but also be extended as a multi-task kernel to all of the Euclidean space. Finally, we discuss the interpolation of multi-task kernels by multi-task finite-rank kernels.

Multiscale collocation methods are developed in [3] for solving a system of integral equations that is a reformulation of the Tikhonov-regularized second-kind equation of an ill-posed integral equation of the first kind. This problem is closely related to regularization problems in machine learning. Direct numerical solution of the Tikhonov regularization equation requires generating a matrix representation of the composition of the conjugate operator with the original integral operator, which is computationally costly. To overcome this computational issue, rather than directly solving the Tikhonov-regularized equation, we propose to solve an equivalent coupled system of integral equations. We apply a multiscale collocation method with a matrix compression strategy to discretize the system of integral equations and then use the multilevel augmentation method to solve the resulting discrete system. A priori and a posteriori ...
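
To make the coupled-system reformulation mentioned for [3] concrete, here is a minimal sketch in generic operator notation (illustrative only, not necessarily the exact formulation of [3]). For a first-kind equation $\mathcal{A}u = f$ with a compact integral operator $\mathcal{A}$, Tikhonov regularization with parameter $\alpha > 0$ leads to the second-kind equation
$$(\alpha I + \mathcal{A}^{*}\mathcal{A})\, u_{\alpha} = \mathcal{A}^{*} f,$$
whose direct discretization requires a matrix representation of the product $\mathcal{A}^{*}\mathcal{A}$. Introducing the auxiliary unknown $v_{\alpha} = f - \mathcal{A}u_{\alpha}$, the same solution is characterized by the coupled system
$$v_{\alpha} + \mathcal{A}u_{\alpha} = f, \qquad \alpha u_{\alpha} - \mathcal{A}^{*} v_{\alpha} = 0,$$
in which $\mathcal{A}$ and $\mathcal{A}^{*}$ appear only individually, so compressed matrix representations of each operator can be used without ever forming the product.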


Similar resources

MODELING OF FLOW NUMBER OF ASPHALT MIXTURES USING A MULTI–KERNEL BASED SUPPORT VECTOR MACHINE APPROACH

Flow number of asphalt–aggregate mixtures as an explanatory factor has been proposed in order to assess the rutting potential of asphalt mixtures. This study proposes a multiple–kernel based support vector machine (MK–SVM) approach for modeling of flow number of asphalt mixtures. The MK–SVM approach consists of weighted least squares–support vector machine (WLS–SVM) integrating two kernel funct...
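
A minimal sketch of the two-kernel idea described above, assuming a weighted sum of an RBF and a polynomial Gram matrix plugged into a standard least-squares SVM regression solver; the kernel choices, weights, and names below are illustrative and are not taken from the paper.

import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) Gram matrix between the rows of X and Z
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def poly_kernel(X, Z, degree=2, coef0=1.0):
    # Polynomial Gram matrix between the rows of X and Z
    return (X @ Z.T + coef0) ** degree

def mk_lssvm_fit(X, y, w1=0.5, w2=0.5, reg=10.0):
    # Weighted combination of the two Gram matrices
    K = w1 * rbf_kernel(X, X) + w2 * poly_kernel(X, X)
    n = y.shape[0]
    # Standard LS-SVM regression dual system for the bias b and coefficients alpha:
    # [0, 1^T; 1, K + I/reg] [b; alpha] = [0; y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / reg
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias, dual coefficients

def mk_lssvm_predict(X_train, X_new, b, alpha, w1=0.5, w2=0.5):
    # Evaluate the fitted model at new inputs using the same weighted kernel
    K_new = w1 * rbf_kernel(X_new, X_train) + w2 * poly_kernel(X_new, X_train)
    return K_new @ alpha + b

# Tiny usage example on synthetic data
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
b, alpha = mk_lssvm_fit(X, y)
print(mk_lssvm_predict(X, X[:5], b, alpha))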

A Review of Kernel Methods in Machine Learning

We review recent methods for learning with positive definite kernels. All these methods formulate learning and estimation problems as linear tasks in a reproducing kernel Hilbert space (RKHS) associated with a kernel. We cover a wide range of methods, ranging from simple classifiers to sophisticated methods for estimation with structured data. (AMS 2000 subject classifications: primary 30C40 Ke...

Semi-Supervised Composite Kernel Learning Using Distance Metric Learning Techniques

Distance metric has a key role in many machine learning and computer vision algorithms so that choosing an appropriate distance metric has a direct effect on the performance of such algorithms. Recently, distance metric learning using labeled data or other available supervisory information has become a very active research area in machine learning applications. Studies in this area have shown t...

A note on extension theorems and its connection to universal consistency in machine learning

Statistical machine learning plays an important role in modern statistics and computer science. One main goal of statistical machine learning is to provide universally consistent algorithms, i.e., the estimator converges in probability or in some stronger sense to the Bayes risk or to the Bayes decision function. Kernel methods based on minimizing the regularized risk over a reproducing kernel ...

Online Adaptive Machine Learning Based Algorithm for Implied Volatility Surface Modeling

In this work, we design a machine learning based method – online adaptive primal support vector regression (SVR) – to model the implied volatility surface. The algorithm proposed is the first derivation and implementation of an online primal kernel SVR. It features enhancements that allow online adaptive learning by embedding the idea of local fitness and budget maintenance. To accelerate our a...

Kernel CCA Based Transfer Learning for Software Defect Prediction

A transfer learning method, called Kernel Canonical Correlation Analysis plus (KCCA+), is proposed for heterogeneous cross-company defect prediction. Combining the kernel method and transfer learning techniques, this method improves the performance of the predictor with more adaptive ability in nonlinearly separable scenarios. Experiments validate its effectiveness. key words: machine learning,...



Publication date: 2012